
Election 2024
28 June 2024

The risk of deepfakes deciding elections is real

Trust in politicians is at record lows even without the menace of disinformation.

By Will Dunn

Earlier this year I realised, to my profound disappointment, that I hadn’t really been listening to Arnold Schwarzenegger. For a week I listened to a podcast called Arnold’s Pump Club, in which (I thought) the heavily muscled former governor of California talks about diet and exercise. I’ve always liked Arnie’s films and his voice filled the air during several sessions of loft-clearing. After a few episodes, however, something began to jar – it was weirdly repetitive, uncharacteristically flat. I checked the blurb from the podcast’s producers, which says the podcast is “by Arnold Schwarzenegger”, but that it is made possible “thanks to a helpful machine he trained”. So it’s not Arnold. It’s a machine-learning voice model (“AI”, if you insist) trained on Arnold’s voice and fed a script. “It’s jaaast a compuduh!”, I cried in my best Austro-Californian accent, my biceps visibly shrinking at the deception.

Last night, Channel 4’s Dispatches revealed a more serious downside to AI. In April it conducted an experiment with 25 voters from Southend-on-Sea, all of whom described themselves as “undecided” when the testing began. These voters were told they were being surveyed for their reactions to “news and campaign material”; what they weren’t told was that researchers were mixing AI-generated disinformation and deepfakes into what they were seeing to push them towards voting for either Labour or the Conservatives. After watching videos in which deepfaked versions of Rishi Sunak and Keir Starmer announced fake policies or revealed their true (but actually not true) intentions, these voters cast mock ballots.

The results were striking: all but two of the voters went in the direction the deepfakes had pushed them, implying that AI’s success rate in manipulating floating voters could be as high as 92 per cent.

This is an excellent piece of journalism by Dispatches, which has led the way on coverage of deepfakes. There is a real appetite for using AI to change people’s minds in this way: a study by researchers at Google’s AI division, DeepMind, recently found that impersonation (deepfakes) was the most popular use of generative AI, and that “opinion manipulation” was the second most popular reason (after fraud) for doing so.

A survey shared exclusively with the New Statesman today suggests the wider public is rightly concerned at the political influence of deepfakes: more than half of 18- to 34-year-olds (from 2,052 respondents across the UK) said they were worried that deepfakes could influence the outcome of the election, and 64 per cent said they were not confident they would be able to spot audio or video that had been faked.


However, the most striking result from the survey, conducted on behalf of the identity verification company Onfido, is that almost a quarter (23 per cent) of respondents said they no longer trust any political content on social media.

On the one hand, this is quite encouraging. People are right to be suspicious of what they see on social media, because social media platforms are unregulated and every piece of content placed on them is free at the point of access but monetised by some other means. Subscribe to a political magazine and you’re the customer; subscribe to an ad-funded political channel and you’re the product being sold.

However, this extends beyond social media. The most recent British Social Attitudes report found that trust in (real) politicians is at the lowest point in the survey’s 41-year history. Last year the Ipsos Veracity Index, which has been running since 1983, also reported a record low in trust in politicians. Real news, fake news – either way, most people feel they are not being told the truth, and when I look at the policy costings in party manifestos I’m inclined to agree.

This will be a particular problem for Labour because if it is going to produce a decade of national renewal, rather than a parliament of national grumbling, it will need to persuade the electorate to trust that its decisions will pay off in the long term. An agile government could boldly regulate social media and the use of AI for political purposes, but the wider problem of mistrust will be much harder to address.

This piece first appeared in the Morning Call newsletter; receive it every morning by subscribing on Substack here.

